On the number of spurious memories in the Hopfield model
Authors
Abstract
It is shown that the outer-product method for programming the Hopfield model can result in many spurious stable states, exponential in the number of vectors that we want to store, even in the case when the vectors are orthogonal.
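As an illustration of the construction the abstract refers to, here is a minimal sketch of the outer-product (Hebbian) storage rule; the patterns and dimensions are illustrative and not taken from the paper. It exhibits one simple, well-known family of spurious fixed points: the negations of the stored patterns.

```python
import numpy as np

def train(patterns):
    """Outer-product (Hebbian) rule: W = sum_mu x_mu x_mu^T, zero diagonal."""
    W = patterns.T @ patterns
    np.fill_diagonal(W, 0)
    return W

def is_stable(W, x):
    """A state is a fixed point if one synchronous update leaves it unchanged."""
    return np.array_equal(np.sign(W @ x), x)

# Two orthogonal +/-1 patterns of dimension 4 (illustrative).
p1 = np.array([1,  1, 1,  1])
p2 = np.array([1, -1, 1, -1])
W = train(np.vstack([p1, p2]))

# The stored patterns are stable ...
assert is_stable(W, p1) and is_stable(W, p2)
# ... and, because the dynamics are sign-symmetric, so are their
# negations, which were never explicitly stored (spurious memories).
assert is_stable(W, -p1) and is_stable(W, -p2)
```

The negations are only the simplest spurious states; the paper's count concerns the much larger exponential family that the outer-product rule can create.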
Similar references
Reflexive Associative Memories
In the synchronous discrete model, the average memory capacity of bidirectional associative memories (BAMs) is compared with that of Hopfield memories, by means of a calculation of the percentage of good recall for 100 random BAMs of dimension 64x64, for different numbers of stored vectors. The memory capacity is found to be much smaller than the Kosko upper bound, which is the lesser of the tw...
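For context on the model being measured, here is a minimal BAM recall sketch, assuming the standard Kosko outer-product construction; the pattern pairs and dimensions are illustrative, not from the study.

```python
import numpy as np

def bam_weights(X, Y):
    """Kosko outer-product construction: W = sum_mu y_mu x_mu^T."""
    return Y.T @ X

def bam_recall(W, x, steps=5):
    """Alternate layer updates; a stored pair is recalled when it is a
    fixed point of the back-and-forth dynamics."""
    for _ in range(steps):
        y = np.sign(W @ x)       # forward pass: x layer -> y layer
        x = np.sign(W.T @ y)     # backward pass: y layer -> x layer
    return x, y

# Two illustrative +/-1 pattern pairs.
X = np.array([[1, -1,  1, -1], [ 1, 1, -1, -1]])
Y = np.array([[1,  1, -1],     [-1, 1,  1]])
W = bam_weights(X, Y)

x, y = bam_recall(W, X[0])  # recall from the first stored x-pattern
```

With these two pairs, recall from `X[0]` returns the stored pair `(X[0], Y[0])`; the capacity question the abstract studies is how often this succeeds as the number of stored pairs grows.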
Improved Hopfield Networks by Training with Noisy Data
A new approach to training a generalized Hopfield network is developed and evaluated in this work. Both the weight symmetricity constraint and the zero selfconnection constraint are removed from standard Hopfield networks. Training is accomplished with BackPropagation Through Time, using noisy versions of the memorized patterns. Training in this way is referred to as Noisy Associative Training ...
Validation in Distributed Representations
In the Hopfield model of content addressable memory, the number of spurious attractors is exponential in the dimensionality of the memory. Hence it is highly likely that the system converges to a spurious memory on an arbitrary input. It is desirable that the system has a way of checking whether its state corresponds to any one of the stored patterns. In this paper we show that it is possible ...
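A naive external membership check, which is not the validation scheme this paper proposes (the paper builds the check into the representation itself), would simply compare the converged state against the stored patterns:

```python
import numpy as np

def matches_stored(x, patterns):
    """True if x equals a stored pattern or its negation; such an
    external comparison requires keeping the pattern list around,
    which defeats the purpose of a content-addressable memory."""
    return any(np.array_equal(x, s * p) for p in patterns for s in (1, -1))

# Illustrative stored patterns.
patterns = np.array([[1,  1, -1, -1],
                     [1, -1,  1, -1]])

assert matches_stored(np.array([1, 1, -1, -1]), patterns)       # stored
assert not matches_stored(np.array([1, 1, 1, -1]), patterns)    # spurious
```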
High-Capacity Quantum Associative Memories
We review our models of quantum associative memories that represent the “quantization” of fully coupled neural networks like the Hopfield model. The idea is to replace the classical irreversible attractor dynamics driven by an Ising model with pattern-dependent weights by the reversible rotation of an input quantum state onto an output quantum state consisting of a linear superposition with pro...
Memory Dynamics in Attractor Networks with Saliency Weights
Memory is a fundamental part of computational systems like the human brain. Theoretical models identify memories as attractors of neural network activity patterns based on the theory that attractor (recurrent) neural networks are able to capture some crucial characteristics of memory, such as encoding, storage, retrieval, and long-term and working memory. In such networks, long-term storage of ...
Journal title:
- IEEE Trans. Information Theory
Volume 36, Issue
Pages -
Publication date: 1990